# Multilingual Instruction Fine-tuning

| Model | Organization | License | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|
| Granite 3.3 2b Instruct GGUF | lmstudio-community | Apache-2.0 | Large Language Model | 444 | 2 | IBM Granite's 2-billion-parameter instruction model supporting multilingual and long-context tasks with structured reasoning capabilities. |
| Mistral Small 3.1 24B Instruct 2503 GGUF | second-state | Apache-2.0 | Large Language Model, Multilingual | 1,059 | 1 | Mistral-Small-3.1-24B-Instruct-2503 is a 24B-parameter multilingual instruction-fine-tuned model suitable for text generation and dialogue. |
| Granite 3.0 8b Instruct | ibm-granite | Apache-2.0 | Large Language Model, Transformers | 24.29k | 201 | Granite-3.0-8B-Instruct is an 8-billion-parameter model fine-tuned from Granite-3.0-8B-Base on a combination of open-source instruction datasets and internally synthesized datasets. |
| Gemma 2 2b Jpn It | google | | Large Language Model, Transformers, Japanese | 7,510 | 183 | Gemma 2 JPN is a Japanese-text fine-tuned version of the Gemma 2 2B model, excelling at Japanese language processing and suited to a variety of text generation tasks. |
| Llamantino 3 ANITA 8B Inst DPO ITA | swap-uniba | | Large Language Model, Transformers, Multilingual | 6,401 | 25 | LLaMAntino-3-ANITA is a multilingual (English and Italian) large language model based on Meta Llama 3, optimized for Italian NLP tasks. |
| Mixtral 8x22B Instruct V0.1 | mistralai | Apache-2.0 | Large Language Model, Transformers, Multilingual | 12.80k | 723 | Mixtral-8x22B-Instruct-v0.1 is a large language model instruction-fine-tuned from Mixtral-8x22B-v0.1, supporting multiple languages and function calling. |
| Calme 7B Instruct V0.2 | MaziyarPanahi | Apache-2.0 | Large Language Model, Transformers | 15 | 14 | Calme-7B is a 7-billion-parameter language model fine-tuned from Mistral-7B, excelling at generating clear, calm, and coherent text. |
| Mistral 7B Instruct Aya 101 | MaziyarPanahi | Apache-2.0 | Large Language Model, Transformers, Multilingual | 92 | 12 | A multilingual instruction-following model fine-tuned from Mistral-7B-Instruct-v0.2, supporting 101 languages. |
| Aya 101 | CohereLabs | Apache-2.0 | Large Language Model, Transformers, Multilingual | 3,468 | 640 | Aya 101 is a large-scale multilingual generative language model that follows instructions in 101 languages, outperforming comparable models across a range of evaluations. |
| Mixtral 8x7B Instruct V0.1 | mistralai | Apache-2.0 | Large Language Model, Transformers, Multilingual | 505.97k | 4,397 | Mixtral-8x7B is a pretrained generative sparse mixture-of-experts model that outperforms Llama 2 70B on most benchmarks. |
| Flan T5 Xl | google | Apache-2.0 | Large Language Model, Multilingual | 257.40k | 494 | FLAN-T5 XL is an instruction-finetuned language model based on the T5 architecture, with significantly improved multilingual and few-shot performance after fine-tuning on more than 1,000 tasks. |
| Flan T5 Base | google | Apache-2.0 | Large Language Model, Multilingual | 3.3M | 862 | FLAN-T5 is a language model optimized through instruction fine-tuning of T5; it supports multilingual task processing and outperforms the original T5 at the same parameter count. |
© 2025 AIbase